Local order and magnetic field effects on the electronic properties of disordered binary alloys in the Quantum Site Percolation limit
Electronic properties of disordered binary alloys are studied via the
calculation of the average Density of States (DOS) in two and three dimensions.
We propose a new approximate scheme that allows for the inclusion of local
order effects in finite geometries and extrapolates the behavior of infinite
systems following `finite-size scaling' ideas. We particularly investigate the
limit of the Quantum Site Percolation regime described by a tight-binding
Hamiltonian. This limit was chosen to probe the role of short range order (SRO)
properties under extreme conditions. The method is numerically highly efficient
and asymptotically exact in important limits, predicting the correct DOS
structure as a function of the SRO parameters. Magnetic field effects can also
be included in our model to study the interplay of local order and the shifted
quantum interference driven by the field. The average DOS is highly sensitive
to changes in the SRO properties, and striking effects are observed when a
magnetic field is applied near the segregated regime. The new effects observed
are twofold: there is a reduction of the band width and the formation of a gap
in the middle of the band, both as a consequence of destructive interference of
electronic paths and the loss of coherence for particular values of the
magnetic field. The above phenomena are periodic in the magnetic flux. For
other limits that imply strong localization, the magnetic field produces minor
changes in the structure of the average DOS.

Comment: 13 pages, 9 figures, 31 references, RevTeX preprint, submitted to Phys. Rev.
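As a rough illustration of the kind of calculation the abstract describes (not the authors' approximate scheme), the average DOS of a 2D quantum site percolation tight-binding model in a magnetic field can be sketched by exact diagonalization of small disordered clusters, with the field entering through standard Peierls phases in the Landau gauge. All names and parameter values below are illustrative.

```python
import numpy as np

def site_percolation_dos(L=10, p=0.7, flux=0.0, realizations=20, bins=40, seed=0):
    """Average DOS of a 2D quantum site percolation tight-binding model.

    Sites of an L x L square lattice are occupied with probability p;
    nearest-neighbour hopping (t = 1) connects occupied sites, and a
    uniform magnetic flux (in units of the flux quantum per plaquette)
    enters through Peierls phases in the Landau gauge.  A generic
    sketch, not the paper's method; parameters are illustrative.
    """
    rng = np.random.default_rng(seed)
    all_eigs = []
    for _ in range(realizations):
        occupied = rng.random((L, L)) < p
        sites = np.argwhere(occupied)
        idx = -np.ones((L, L), dtype=int)
        for k, (x, y) in enumerate(sites):
            idx[x, y] = k
        n = len(sites)
        H = np.zeros((n, n), dtype=complex)
        for x, y in sites:
            # Hop in +x: no phase in the Landau gauge A = (0, B x, 0).
            if x + 1 < L and occupied[x + 1, y]:
                H[idx[x, y], idx[x + 1, y]] = -1.0
                H[idx[x + 1, y], idx[x, y]] = -1.0
            # Hop in +y: picks up the Peierls phase exp(2*pi*i*flux*x).
            if y + 1 < L and occupied[x, y + 1]:
                t = -np.exp(2j * np.pi * flux * x)
                H[idx[x, y], idx[x, y + 1]] = t
                H[idx[x, y + 1], idx[x, y]] = np.conj(t)
        all_eigs.append(np.linalg.eigvalsh(H))
    eigs = np.concatenate(all_eigs)
    hist, edges = np.histogram(eigs, bins=bins, density=True)
    return hist, edges
```

Scanning `flux` over one period (0 to 1) would exhibit the periodicity in the magnetic flux mentioned in the abstract; short-range-order effects would require correlated (rather than independent) site occupation.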
Kinect as an access device for people with cerebral palsy: A preliminary study
Cerebral palsy (CP) describes a group of disorders affecting the development of movement and posture, causing activity limitation. Access to technology can alleviate some of these limitations. Many studies have used vision-based movement capture systems to overcome problems related to discomfort and fear of wearing devices. In contrast, there has been no research assessing the behavior of vision-based movement capture systems in people with involuntary movements. In this paper, we look at the potential of the Kinect sensor as an assistive technology for people with cerebral palsy. We developed a serious game, called KiSens Números, to study the behavior of Kinect in this context, and eighteen subjects with cerebral palsy used it to complete a set of sessions. The results of the experiments show that Kinect filters some of people's involuntary movements, confirming the potential of Kinect as an assistive technology for people with motor disabilities.
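The kind of filtering alluded to can be illustrated with a simple moving-average smoother applied to a one-dimensional joint trajectory. This is a generic sketch of tremor attenuation, not the processing performed by the Kinect SDK or by KiSens Números; the signal shapes are synthetic.

```python
import numpy as np

def smooth_joint_track(positions, window=5):
    """Moving-average filter for a 1D joint-coordinate track.

    A generic smoothing sketch: fast, tremor-like components are
    attenuated while slow, intentional movement is largely preserved.
    """
    kernel = np.ones(window) / window
    return np.convolve(positions, kernel, mode="same")

# Synthetic track: slow intentional motion plus fast tremor-like jitter.
t = np.arange(200)
intentional = np.sin(2 * np.pi * t / 100)   # slow component (period 100)
tremor = 0.5 * np.sin(2 * np.pi * t / 4)    # fast component (period 4)
smoothed = smooth_joint_track(intentional + tremor, window=5)
```

Away from the edges, the smoothed track stays close to the intentional component while the tremor amplitude is strongly reduced.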
Differentiability of correlations in Realistic Quantum Mechanics
We prove a version of Bell's Theorem in which the Locality assumption is
weakened. We start by assuming theoretical quantum mechanics and weak forms of
relativistic causality and of realism (essentially the fact that observable
values are well defined independently of whether or not they are measured).
Under these hypotheses, we show that only one of the correlation functions that
can be formulated in the framework of the usual Bell theorem is unknown. We
prove that this unknown function must be differentiable at certain angular
configuration points that include the origin. We also prove that, if this
correlation is assumed to be twice differentiable at the origin, then we arrive
at a version of Bell's theorem. On the one hand, we are showing that any
realistic theory of quantum mechanics which incorporates the kinematic aspects
of relativity must lead to this type of \emph{rough} correlation function that
is once but not twice differentiable. On the other hand, this study brings us a
single degree of differentiability away from a relativistic von Neumann no
hidden variables theorem.

Comment: Final version, published in JM
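To make the differentiability distinction concrete (an illustrative example in our notation, not taken from the paper): the singlet quantum correlation is smooth at the origin, whereas a function with a fractional-power cusp is once but not twice differentiable there.

```latex
% The singlet quantum correlation is smooth (C^\infty) at \theta = 0,
% while a correlation with a fractional-power term (a > 0 constant)
% is once but not twice differentiable at the origin:
\begin{align}
  E_{\mathrm{QM}}(\theta)    &= -\cos\theta
      = -1 + \tfrac{1}{2}\theta^{2} + O(\theta^{4}), \\
  E_{\mathrm{rough}}(\theta) &= -1 + a\,\lvert\theta\rvert^{3/2},
  \qquad E_{\mathrm{rough}}'(0) = 0,
  \qquad \lim_{\theta \to 0} E_{\mathrm{rough}}''(\theta) = \infty .
\end{align}
```

Here $E_{\mathrm{rough}}'$ exists and is continuous at $0$, but the second derivative diverges there, which is the "once but not twice differentiable" behavior described in the abstract.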
MSSM Forecast for the LHC
We perform a forecast of the MSSM with universal soft terms (CMSSM) for the
LHC, based on an improved Bayesian analysis. We do not incorporate ad hoc
measures of the fine-tuning to penalize unnatural possibilities: such
penalization arises from the Bayesian analysis itself when the experimental
value of is considered. This allows one to scan the whole parameter space,
allowing arbitrarily large soft terms. Still, the low-energy region is
statistically favoured (even before including dark matter or g-2 constraints).
Contrary to other studies, the results are almost unaffected by changing the
upper limits taken for the soft terms. The results are also remarkably stable
when using flat or logarithmic priors, a fact that arises from the larger
statistical weight of the low-energy region in both cases. Then we incorporate
all the important experimental constraints to the analysis, obtaining a map of
the probability density of the MSSM parameter space, i.e. the forecast of the
MSSM. Since not all the experimental information is equally robust, we perform
separate analyses depending on the group of observables used. When only the
most robust ones are used, the favoured region of the parameter space contains
a significant portion outside the LHC reach. This effect gets reinforced if the
Higgs mass is not close to its present experimental limit and persists when dark
matter constraints are included. Only when the g-2 constraint (based on
data) is considered, the preferred region (for ) is well inside
the LHC scope. We also perform a Bayesian comparison of the positive- and
negative- possibilities.

Comment: 42 pages; added figures and reference
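The prior-robustness claim can be illustrated with a one-parameter toy scan (entirely schematic, not the paper's CMSSM likelihood): when the likelihood itself concentrates statistical weight at low mass scales, the posterior barely depends on whether a flat or a logarithmic prior is used.

```python
import numpy as np

def posterior_mean(prior, m, like):
    """Posterior mean of m on a uniform grid (grid spacing cancels)."""
    w = prior * like
    return float((m * w).sum() / w.sum())

# Toy soft-mass parameter m on [0.1, 10] (arbitrary units) with a
# likelihood favouring low scales -- a schematic stand-in for the
# Bayesian weight of the low-energy region, not the paper's analysis.
m = np.linspace(0.1, 10.0, 2000)
like = np.exp(-m)

mean_flat = posterior_mean(np.ones_like(m), m, like)  # flat prior
mean_log = posterior_mean(1.0 / m, m, like)           # logarithmic prior
```

Both posterior means land well below the upper end of the scan range, so the inference is driven by the likelihood rather than by the prior choice or the assumed upper limit, mirroring the stability reported in the abstract.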
Boltzmann entropy of a Newtonian Universe
A dynamical estimate is given for the Boltzmann entropy of the Universe,
under the simplifying assumptions provided by Newtonian cosmology. We first
model the cosmological fluid as the probability fluid of a quantum-mechanical
system. Next, following current ideas about the emergence of spacetime, we
regard gravitational equipotentials as isoentropic surfaces. Therefore
gravitational entropy is proportional to the vacuum expectation value of the
gravitational potential in a certain quantum state describing the matter
contents of the Universe. The entropy of the matter sector can also be
computed. While providing values of the entropy that turn out to be somewhat
higher than existing estimates, our results are in perfect compliance with the
upper bound set by the holographic principle.

Comment: 15 pages
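Schematically, and in our own notation rather than the paper's: treating gravitational equipotentials as isoentropic surfaces makes the gravitational entropy proportional to the expectation value of the Newtonian potential in the quantum state describing the matter contents, subject to the holographic bound.

```latex
% Schematic form of the estimate (notation ours, not the paper's):
% S_grav is proportional to the expectation value of the Newtonian
% potential operator U in the matter state |psi>, and the total
% entropy is bounded by the holographic limit for horizon area A.
\begin{equation}
  S_{\mathrm{grav}} \;\propto\; \langle \psi \,|\, \hat{U} \,|\, \psi \rangle ,
  \qquad
  S_{\mathrm{total}} \;\leq\; \frac{k_{B}\, c^{3} A}{4 G \hbar} .
\end{equation}
```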
The health of SUSY after the Higgs discovery and the XENON100 data
We analyze the implications for the status and prospects of supersymmetry of
the Higgs discovery and the latest XENON data. We focus mainly, but not only, on
the CMSSM and NUHM models. Using a Bayesian approach we determine the
distribution of probability in the parameter space of these scenarios. This
shows that, most probably, they are now beyond the LHC reach. The chances of a
negative result increase further (at more than 95% c.l.) if one includes dark matter
constraints in the analysis, in particular the latest XENON100 data. However, the
models would be probed completely by XENON1T. The mass of the LSP neutralino
gets essentially fixed around 1 TeV. We do not incorporate ad hoc measures of
the fine-tuning to penalize unnatural possibilities: such penalization arises
automatically from the careful Bayesian analysis itself, and allows one to scan the
whole parameter space. In this way, we can explain and resolve the apparent
discrepancies between the previous results in the literature. Although SUSY has
become hard to detect at the LHC, this does not necessarily mean that it is very
fine-tuned. We use Bayesian techniques to show that the experimental Higgs mass
is off the CMSSM or NUHM expectation. The discrepancy is substantial but
not dramatic. Although the CMSSM or the NUHM are unlikely to show up at the
LHC, they are still interesting and plausible models after the Higgs
observation; and, if they are true, the chances of discovering them in future
dark matter experiments are quite high.
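The kind of tension statement quoted can be sketched generically: given a Gaussian predictive distribution for the Higgs mass under a model, the discrepancy of the observed value in standard deviations, and the corresponding two-sided p-value, follow from the error function. The numbers below are placeholders for illustration, not the paper's posterior.

```python
import math

def gaussian_tension(observed, pred_mean, pred_sigma):
    """Discrepancy (in sigma) and two-sided p-value of an observation
    against a Gaussian predictive distribution.  The inputs are
    placeholders for illustration, not values from the paper."""
    n_sigma = abs(observed - pred_mean) / pred_sigma
    p_value = math.erfc(n_sigma / math.sqrt(2.0))
    return n_sigma, p_value

# Hypothetical predictive mean/width and observed mass (GeV):
n_sigma, p_value = gaussian_tension(observed=125.0, pred_mean=119.0,
                                    pred_sigma=3.0)
```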
LHC and dark matter phenomenology of the NUGHM
We present a Bayesian analysis of the NUGHM, a supersymmetric scenario with
non-universal gaugino masses and Higgs masses, including all the relevant
experimental observables and dark matter constraints. The main merit of the
NUGHM is that it essentially includes all the possibilities for dark matter
(DM) candidates within the MSSM, since the neutralino and chargino spectrum
-and composition- are as free as they can be in the general MSSM. We identify
the most probable regions in the NUGHM parameter space, and study the
associated phenomenology at the LHC and the prospects for DM direct detection.
Requiring that the neutralino makes all of the DM in the Universe, we identify
two preferred regions around ,
which correspond to the (almost) pure Higgsino and wino case. There exist other
marginal regions (e.g. Higgs-funnel), but with much less statistical weight.
The prospects for detection at the LHC in this case are quite pessimistic, but
future direct detection experiments like LUX and XENON1T, will be able to probe
this scenario. In contrast, when allowing other DM components, the prospects
for detection at the LHC become more encouraging -- the most promising signals
being, besides the production of gluinos and squarks, the production of the
heavier chargino and neutralino states, which lead to WZ and same-sign WW final
states -- and direct detection remains a complementary, and even more powerful,
way to probe the scenario.

Comment: The Sommerfeld enhancement has been included in the computation of the relic density and in the discussion of indirect-detection limits. Some references have been added
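The Sommerfeld enhancement mentioned in the comment can be illustrated with the standard Coulomb-limit enhancement factor for an attractive interaction with a light mediator (a textbook approximation, not necessarily the implementation used in the paper); the coupling and velocities below are illustrative.

```python
import math

def sommerfeld_coulomb(v, alpha):
    """Coulomb-limit Sommerfeld factor S = x / (1 - exp(-x)), with
    x = pi * alpha / v, for an attractive interaction with a light
    mediator.  v is the relative velocity in units of c; alpha is an
    illustrative effective coupling, not a value from the paper."""
    x = math.pi * alpha / v
    return x / (1.0 - math.exp(-x))

# The enhancement grows at low velocity (relevant for heavy Higgsino/
# wino dark matter annihilating today) and switches off at high v:
s_low = sommerfeld_coulomb(v=1e-3, alpha=1.0 / 30.0)
s_high = sommerfeld_coulomb(v=1.0, alpha=1.0 / 30.0)
```

At low velocity the factor scales as pi * alpha / v, which is why including it matters for the relic density and for indirect-detection limits on ~1 TeV neutralinos.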